GeForce FX Series

The GeForce FX or "GeForce 5" series (codenamed NV30) is a line of graphics processing units (GPUs) from the manufacturer Nvidia.


Overview

Nvidia's GeForce FX series is the fifth generation of the GeForce line. With the GeForce 3, the company introduced programmable shader functionality into its 3D architecture, in line with the release of Microsoft's DirectX 8.0. The GeForce 4 Ti was an enhancement of the GeForce 3 technology. With real-time 3D graphics technology continually advancing, the release of DirectX 9.0 brought further refinement of programmable pipeline technology with the arrival of Shader Model 2.0. The GeForce FX series is Nvidia's first generation of Direct3D 9-compliant hardware.

The series was manufactured on TSMC's 130 nm fabrication process. It is compliant with Shader Model 2.0/2.0A, allowing more flexibility in complex shader/fragment programs and much higher arithmetic precision. It supports a number of new memory technologies, including DDR2, GDDR2 and GDDR3, and saw Nvidia's first implementation of a memory data bus wider than 128 bits. The anisotropic filtering implementation has potentially higher quality than previous Nvidia designs. Anti-aliasing methods have been enhanced, and additional modes are available compared to the GeForce 4. Memory bandwidth and fill-rate optimization mechanisms have been improved, and some members of the series offer double fill-rate in z-buffer/stencil-only passes. The series also brought improvements to the company's video processing hardware in the form of the Video Processing Engine (VPE), which was first deployed in the GeForce 4 MX. The primary addition, compared to previous Nvidia GPUs, was per-pixel video deinterlacing.

The initial version of the GeForce FX (the 5800) was one of the first cards to come equipped with a large dual-slot cooler. Called "Flow FX", the cooler was very large in comparison to ATI's small, single-slot cooler on the 9700 series, and was jokingly referred to as the "Dustbuster" due to its high level of fan noise.

The advertising campaign for the GeForce FX featured the Dawn demo, which was the work of several veterans from the computer-animated film Final Fantasy: The Spirits Within. Nvidia touted it as "The Dawn of Cinematic Computing". Nvidia also debuted a new campaign to motivate developers to optimize their titles for Nvidia hardware at the Game Developers Conference (GDC) in 2002. In exchange for prominently displaying the Nvidia logo on the outside of the game packaging, the company offered free access to a state-of-the-art test lab in Eastern Europe that tested against 500 different PC configurations for compatibility. Developers also had extensive access to Nvidia engineers, who helped produce code optimized for the company's products. Hardware based on the NV30 project did not launch until near the end of 2002, several months after ATI had released its competing DirectX 9 architecture.


Overall performance

GeForce FX is an architecture designed with DirectX 7, 8, and 9 software in mind. Its DirectX 7 and 8 performance was generally equal to that of ATI's competing products with the mainstream versions of the chips, and somewhat faster in the case of the 5900 and 5950 models, but it is much less competitive across the entire range in software that primarily uses DirectX 9 features (Jason Cross, "Benchmarking Half-Life 2: ATI vs. NVIDIA", ExtremeTech, November 29, 2004).

Its weak performance in processing Shader Model 2 programs is caused by several factors. The NV3x design has less overall parallelism and calculation throughput than its competitors (Demirug, "CineFX (NV30) Inside", 3DCenter, August 31, 2003). It is also more difficult, compared to the GeForce 6 and ATI's Radeon R300 series, to achieve high efficiency with the architecture, due to architectural weaknesses and a resulting heavy reliance on optimized pixel shader code. While the architecture was compliant overall with the DirectX 9 specification, it was optimized for performance with 16-bit shader code, which is less than the 24-bit minimum precision that the standard requires. When 32-bit shader code is used, the architecture's performance is severely hampered. Proper instruction ordering and composition of shader code is critical for making the most of the available computational resources.
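The practical difference between 16-bit and 32-bit precision can be illustrated numerically. The sketch below is a minimal illustration, not GPU code: it uses Python's standard-library `struct` module (format `'e'` for IEEE 754 half precision, `'f'` for single precision) as a stand-in for the FX's fast 16-bit path versus full 32-bit shading; the helper names `to_fp16`/`to_fp32` are illustrative, not part of any API.

```python
import struct

def to_fp16(x: float) -> float:
    """Round a Python float to IEEE 754 half precision (fp16) and back."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_fp32(x: float) -> float:
    """Round a Python float to IEEE 754 single precision (fp32) and back."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

# fp16 has a 10-bit mantissa: near 1.0 the spacing between representable
# values is 2**-10, so small per-pixel differences vanish at 16 bits.
print(to_fp16(1.0001))   # -> 1.0 (difference lost)
print(to_fp32(1.0001))   # -> ~1.0001 (difference preserved)

# Integers above 2048 are no longer exactly representable in fp16, which
# matters for shader arithmetic such as addressing into large textures.
print(to_fp16(2049.0))   # -> 2048.0
print(to_fp32(2049.0))   # -> 2049.0
```

This is the kind of error the DirectX 9 minimum of 24-bit precision was meant to keep in check; on NV3x, requesting full precision to avoid it came at a steep performance cost.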


Hardware refreshes and diversification

Nvidia's initial release, the GeForce FX 5800, was intended as a high-end part. At the time, there were no GeForce FX products for the other segments of the market: the GeForce 4 MX continued in its role as the budget video card, and the older GeForce 4 Ti cards filled in the mid-range. In April 2003, the company introduced the GeForce FX 5600 and the GeForce FX 5200 to address the other market segments. Each had an "Ultra" variant and a slower, budget-oriented variant, and all used conventional single-slot cooling solutions. The 5600 Ultra had respectable performance overall, but it was slower than the Radeon 9600 Pro and sometimes slower than the GeForce 4 Ti series. The FX 5200 did not perform as well as the DirectX 7.0-generation GeForce 4 MX440 or Radeon 9000 Pro in some benchmarks.

In May 2003, Nvidia launched the GeForce FX 5900 Ultra, a new high-end product to replace the low-volume and disappointing FX 5800. Based upon a revised GPU called NV35, which fixed some of the DirectX 9 shortcomings of the discontinued NV30, this product was more competitive with the Radeon 9700 and 9800. In addition to redesigning parts of the GPU, the company moved to a 256-bit memory data bus, allowing for significantly higher memory bandwidth than the 5800 even when utilizing more common DDR SDRAM instead of DDR2. The 5900 Ultra performed somewhat better than the Radeon 9800 Pro in games not making heavy use of Shader Model 2, and had a quieter cooling system than the 5800.

In October 2003, Nvidia released the GeForce FX 5700 and GeForce FX 5950. The 5700 was a mid-range card using the NV36 GPU with technology from the NV35, while the 5950 was a high-end card again using the NV35 GPU but with additional clock speed. The 5950 also featured a redesigned version of the 5800's Flow FX cooler, this time using a larger, slower fan and running much quieter as a result. The 5700 provided strong competition for the Radeon 9600 XT in games limited to light use of Shader Model 2, and the 5950 was competitive with the Radeon 9800 XT, again as long as pixel shaders were lightly used. In December 2003, the company launched the GeForce FX 5900 XT, a graphics card intended for the mid-range segment. It was similar to the 5900 Ultra, but clocked slower and used slower memory. It competed more directly with the Radeon 9600 XT, but was still behind in a few shader-intensive scenarios.

The GeForce FX line moved to PCI Express (PCIe) in early 2004 with a number of models, including the PCX 5300, PCX 5750, PCX 5900, and PCX 5950. These cards were largely the same as their AGP predecessors with similar model numbers. To operate on the PCIe bus, an AGP-to-PCIe "HSI bridge" chip on the video card converted the PCIe signals into AGP signals for the GPU. Also in 2004, the GeForce FX 5200/5300 series, which utilized the NV34 GPU, received a new member with the FX 5500.
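The benefit of the 256-bit bus described above is simple arithmetic: peak memory bandwidth is bus width (in bytes) times the effective, double-pumped memory clock. The sketch below computes this using the commonly cited reference clocks for the 5800 Ultra and 5900 Ultra; treat the exact figures as illustrative, since board vendors shipped variants at other clocks.

```python
def peak_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float,
                       transfers_per_clock: int = 2) -> float:
    """Peak memory bandwidth in GB/s for a DDR-style memory interface."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = mem_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * transfers_per_sec / 1e9

# GeForce FX 5800 Ultra: 128-bit bus, 500 MHz DDR2 (1 GHz effective)
print(peak_bandwidth_gbs(128, 500))   # -> 16.0 GB/s

# GeForce FX 5900 Ultra: 256-bit bus, 425 MHz DDR (850 MHz effective)
print(peak_bandwidth_gbs(256, 425))   # -> 27.2 GB/s
```

Doubling the bus width let the 5900 Ultra exceed the 5800 Ultra's bandwidth despite a lower memory clock and cheaper DDR memory.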


Discontinued support

Nvidia has ceased driver support for the GeForce FX series. The final drivers are:

* Windows 9x & Windows Me: 81.98, released on December 21, 2005
* Windows 2000, 32-bit Windows XP & Media Center Edition: 175.19, released on June 23, 2008 (the list of supported products is also on that driver's download page)
* Windows XP (32-bit): 175.40, released on August 1, 2008
* Windows Vista: 96.85, released on October 17, 2006
* Windows Vista: 97.34, released on November 21, 2006

Note that the 175.19 driver is known to break Windows Remote Desktop (RDP); the last version before the problem is 174.74. The issue was apparently fixed in 177.83, although that version is not available for GeForce 5 cards. Also of note, 163.75 is the last driver known to correctly handle adjustment of the video overlay color properties for the GeForce FX series; subsequent WHQL drivers either do not handle the whole range of possible video overlay adjustments (169.21) or have no effect on them (175.xx).

See also

* Comparison of Nvidia graphics processing units
* Rankine (microarchitecture)
* GeForce 4 Series
* GeForce 6 Series
* GeForce 7 Series



External links


* Nvidia: Cinematic Computing for Every User
* Picture and specifications for the FX5800
* Driver downloads
* laptopvideo2go.com: an archive of drivers and modified .INF files for the GeForce FX series
* techPowerUp! GPU Database